.sec(A Low Cost Development System)
.SS(Introduction)
The creation of a truly integrated development system
is a complex task. We cannot expect to specify a design based on
simple combinations of known and  established components and proceed to implement
that in a step-by-step  fashion. Design is an iterative process; given
an initial implementation, we must be prepared for extensive experimentation
and modification. 
Such exploratory behavior is a necessary
ingredient in the development of a highly interactive system. 
We should strive for an interactive system which 
 contains some ability to aid and guide the user. Such systems are called 
"reactive". For example, we must cater for on-line documentation which is
immediately available within any development cycle without terminating the 
current investigation.  We must supply "teaching modes" for any subsystems
we include in the product. We must design for future enhancements which
will permit the system to monitor the user's behavior and, if requested,
become an active participant in the development process. Such systems
are now common  within the Artificial Intelligence community. 
These AI systems are used
for the development  of systems whose complexity will soon be rivaled
by that of the next generation of microprocessor applications.
We should prepare now for that eventuality.

Since
our system's major contribution must be in the area of human interaction,
we must begin the "human interface breadboard" immediately. 
The interface
 must be "forgiving"; it must "do what you expect"; it must be
"natural".  All of these criteria involve the human interface.
Such system characteristics are, of course, also highly subjective. People
view new ideas in terms of their past experience. Until these ideas are
tested against their more traditional alternatives we cannot make meaningful
judgements about their merits.
Therefore it is of utmost importance that we begin our iterative design immediately.
To that end, this paper describes some of the external characteristics 
that our system  must have.  We include a discussion of some of the new
tools, describing
their increased  leverage  in solving problems.
We discuss software and hardware
requirements, including extensions of the concepts to second generation development
systems.
To a  large
extent, this activity is independent of the processor used⊗↓Certainly
there are some  minimal requirements in terms of speed and space;
we will deal with those issues later in the discussion.←; and since 
 the system is Pascal-based, the breadboard software will be transportable.

The critical bottleneck in interactive systems is the "impedance mismatch".
For example, the user has a high bandwidth device called an eye. Most systems
fail to take advantage of the processing speed of the eye, supplying data
at rates and in quantities far below what the eye can assimilate.
We attack this bottleneck
by specifying that the major output device be a display capable of being
manipulated at video rates; for example, able to generate
a 4000 character window in 1/30 second⊗↓Such an interface does not
necessarily imply added cost. Several systems are currently available (or
planned) which are based on simple modifications to home television sets;
more will be said about this shortly.←.
A high speed video display offers several advantages. The text editor
can be page-oriented; we can present text files a screen-at-a-time
rather than a line-at-a-time. This gives the user a much broader
picture of the text file, making both program construction and text
production easier and more natural. Also, since the screen presents a
"window" into the information, the editor can easily conform to
"what you see is what you get"; that is, the screen will not have
editing commands interspersed with text. Insertion and deletion
commands will have an immediate visual effect. These capabilities
are reasonably common now; however, more global text manipulation and formatting,
like justification or cutting-and-pasting, can also be done visually
with a similar improvement in user/machine interaction.
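
To make the window idea concrete: a 4000-character page repainted in 1/30
second is roughly 120,000 characters per second, on the order of a hundred
times what a 9600-baud line can deliver. The Pascal fragment below is only
a sketch of such a page repaint; the names and sizes (TextLine, buffer,
topLine, RefreshWindow) are assumptions made for illustration, not part of
any existing editor.
.begin nofill
(* A minimal, hypothetical sketch of the page-as-window idea.  An
   insertion or deletion simply alters the buffer and repaints; at
   video rates the user sees the effect immediately, with no editing
   commands mixed into the text. *)
program WindowSketch(output);
const
  ScreenLines = 50;                  (* lines visible at one time        *)
  LineWidth   = 80;                  (* 50 x 80 = a 4000 character page  *)
  MaxLines    = 1000;                (* lines held in the text buffer    *)
type
  TextLine = packed array [1..LineWidth] of char;
var
  buffer  : array [1..MaxLines] of TextLine;
  topLine : integer;                 (* first buffer line on the screen  *)
  i, j    : integer;

procedure RefreshWindow;             (* repaint the whole visible page   *)
var k : integer;
begin
  for k := 0 to ScreenLines - 1 do
    if topLine + k <= MaxLines then
      writeln(buffer[topLine + k])   (* stands in for a video-rate write *)
end;

begin
  for i := 1 to MaxLines do          (* start with a blank buffer        *)
    for j := 1 to LineWidth do
      buffer[i][j] := ' ';
  topLine := 1;
  RefreshWindow
end.
.end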

Video techniques offer distinct advantages in program development as well as
text development. We can implement the machine language debugger as a 
"dynamic window" into the state of the machine, using keyboard commands to 
prescribe actions like "single step", "execute", or "examine a location" in various
bases (symbolic, ASCII, octal, hex, ...). Again, the key is to present
information in a highly user-oriented format, reinforcing the
"what you see is what you get" philosophy.

The machine language display debugger can be integrated with the display
editor in several important ways. First, the system should support
on-line documentation. A good display editor is the natural medium
for supplanting hardcopy manuals.
Thus, if the user forgets a debugging command, he should be able to suspend
the debugging session and use the editor to peruse the documentation for the
debugger. With his recollection refreshed,
 the user can then return to the suspended task.
This context switching can be implemented either by 
split screens or by a screen swap; in either case, the screen modification 
must be immediate. A very intriguing problem is the extension of this display
philosophy to the area of high level language development. Most systems
compromise the debugging aspects rather severely, requiring the user to
debug in terms of the underlying machine rather than in terms of the
high level constructs. Such behavior is unacceptable.
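
To return to the context switch for on-line documentation: one possible
shape for it is sketched below in Pascal. Every identifier here (Screen,
ShowManualPage, Repaint, PeruseDocumentation) is an assumption made for
illustration; the point is only that suspending the debugger to read its
own manual need not disturb the debugging state.
.begin nofill
(* Hypothetical sketch of a screen-swap context switch: the debugger's
   screen image is set aside, the editor presents the manual, and the
   image is put back the instant the user returns. *)
program SwapSketch;
const
  ScreenLines = 50;
  LineWidth   = 80;
type
  TextLine = packed array [1..LineWidth] of char;
  Screen   = array [1..ScreenLines] of TextLine;
var
  current, saved : Screen;   (* current would be maintained by the debugger *)
  i, j : integer;

procedure ShowManualPage;  begin (* editor displays the debugger's manual  *) end;
procedure Repaint;         begin (* rewrite current onto the video display *) end;

procedure PeruseDocumentation;
begin
  saved := current;                 (* swap the debugging window out     *)
  ShowManualPage;                   (* documentation occupies the screen *)
  current := saved;                 (* swap the suspended context back   *)
  Repaint                           (* the change must be immediate      *)
end;

begin
  for i := 1 to ScreenLines do      (* blank screen stands in for the    *)
    for j := 1 to LineWidth do      (* image the debugger would build    *)
      current[i][j] := ' ';
  PeruseDocumentation
end.
.end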

Another component of the "impedance mismatch" in interactive systems
involves the control language.  The user must be able to communicate
intentions in a minimal way; the command language must not be verbose.
An interactive system should "play like a musical instrument". Simple
keystrokes should elicit an immediate response. Imagine playing a flute
which required  preprogramming before any sound was issued; imagine
playing a flute which embodied a delay of a second between the pressing of a
key and the production of a note. Such behavior is unthinkable,
yet we have come to expect it from our
computer systems. Part of the difficulty is fundamental, lying in the
amount of computation needed to perform the request; too often, however,
the difficulty is one of human
engineering. Simple requests should involve simple commands and should
take effect immediately. 

Most of the characteristics which I expect our
system to possess are well within reach of current 16-bit
microcomputers. We must realize the increased productivity
which these interactive systems offer, create a prototype for our own needs, 
and begin to refine that system as the basis of our product.

It is particularly exciting to realize that the increased power of such 
systems  does not imply an increase in cost. To some extent, even the
opposite is true. Since we can depend on visual feedback
to position ourselves in text, or to recognize
anomalous conditions in a debugging session, we need
not make our command languages as ponderous; complex string search commands
and detailed breakpoint conditions can be de-emphasized.
We can expect to guide the machine as we drive our cars: having a
general plan in mind, yet able to react to uncertainty and modify our behavior.


In summary, it will be the "friendliness" of the system which will
spell success or failure. The key to friendliness is the communication
bandwidth between user and machine. The current cost-effective
vehicle for such communication
is the integrated use of video techniques. We %3must%1 break with the
traditional hardcopy devices and the inherent limitations which
their use imposes on our thinking; glass teletypes are %3not%1
interactive displays.  Our course must be charted carefully;  besides
introducing a new way of thinking, we are programming with a new language 
and its associated programming methodology. We are combining new
hardware ingredients, even perhaps to the level of casting our lot with
an untried processor. Clearly, caution must be exercised.  However,
caution must be applied with caution. A conservative effort which
breaks no new ground is useless. In the next section I describe 
how we can achieve our objective using a systems approach based
on the parallel development of the prototype hardware and prototype
software.
.SS(The Proposal)
The development cycle can be broken into several semi-independent 
 activities: hardware integration, 
language implementation, operating system implementation, and human
interface definition. This  observation, in itself, is neither
new nor particularly helpful. What %3is%1 useful is the observation
that a large part of each activity can be carried on in parallel; this
is mainly due to the use of a machine independent implementation language.
Most of our software can be developed independently of the final
host; this will allow us to postpone, if necessary, a decision on a %7m%1-processor
until much later in the program.

Since we will have access to a Pascal implementation, at least on dial-up
lines, the staff members can begin familiarizing themselves with the
characteristics of the language. In fact, much of the actual programming
could be done with such facilities. However, such a system is totally
unacceptable for experimenting with the interactive nature of our system.
We must have an in-house system which can be molded into a reasonable
approximation of the final product. This prototype need not have the
same processor as the product, but it must be capable of mimicking the
external behavior of our system. This involves both software and hardware.
Fortunately, such a system will soon exist⊗↓If it didn't, we would either have to
buy one or build one.←: the CIT Pascal micro-engine%8TM%1.

The Pascal micro-engine%8TM%1 is based on the Western Digital version of the
WCS LSI-11. It comes with  the full complement of UCSD Pascal. 
It will be available for evaluation of %3its%1 interactive characteristics.
I propose that we can also use it for a breadboard of %3our%1 interactive
system. This entails adding a video interface to support our screen
behavior, and writing an editor and debugger which use that screen.

The video interface will be immediately transferable to our product,
and I will write the editor in Pascal, allowing it to be transferred
to the LCDS or any future system. The experience of writing the editor
will illuminate several important questions. First, it will show what degree of
machine independence we can expect in time-critical programs like editors.
Next, I will be able to define display primitives which will need
special care within any operating system. At a higher level, improvements
and extensions of the basic Stanford editor may be desirable; for example,
we may want some language-dependent features, either for natural
language (word processing) or for a programming language (Pascal).
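
As an indication of what the display primitives mentioned above might look
like, here is a hedged Pascal sketch; the procedure names and their division
of labor are hypothetical, chosen only to show where machine dependence is
likely to concentrate within the operating system.
.begin nofill
(* A hypothetical set of display primitives of the kind the Pascal
   editor would ask the operating system to supply.  The bodies are
   left empty; they are precisely the machine-dependent code that
   needs special care. *)
program PrimitiveSketch;

procedure MoveCursor(row, col : integer);
begin (* position the cursor on the video screen         *) end;

procedure ClearToEndOfLine;
begin (* erase from the cursor to the right margin       *) end;

procedure PutChar(ch : char);
begin (* deposit ch at the cursor and advance            *) end;

procedure ScrollWindow(lines : integer);
begin (* slide the visible window up or down in the text *) end;

begin
  MoveCursor(1, 1);
  ClearToEndOfLine;
  PutChar('A');
  ScrollWindow(1)
end.
.end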

The video interface will also be an admirable test-bed for developing
our high-level debugging package. That will use the display as a window
on a Pascal machine. I will write that in Pascal as well, thereby
making it available to other systems.

The keyboard interface will also involve exploration and experimentation.
We should investigate the relative merits of diverse techniques like
cursor pads, Stanford's control keys, and the SRI-Xerox mouse, as
well as other devices. Very little study has been given to the
human aspects of input, though Xerox has done some.
I have some ideas which I would like to try.
Such experimentation may pay off handsomely.

The modifications to the Pascal micro-engine%8TM%1 appear to be
minimal, both in terms of time and cost. A Stanford group is
giving me the details of a %7m%1-processor based terminal system
which drives a home TV-set at 14 MHz; their results are very impressive⊗↓It
may be possible to incorporate the Screen-splitter%8TM%1 ideas as well;
however, that would be a "second pass" on the editor.←.

I would expect to work closely with other members of the  Systems
Department: architecture and hardware, as well as software. I would 
like to transfer both the software technology and the programming
methodology to others; there should be valuable information
for the  architecture group in our experience with high-level languages
and processors which cater to such languages. The hardware group
will be of inestimable value, both in terms of their aid in implementing
the new hardware and also their feedback on the final product; they have
to deal with the in-circuit emulator.

.SS(Comparison with Current LCDS Proposal)
My major concern with the proposal is one of emphasis. It appears
that the proposal is driven by the "LC" rather than the "DS".
I would structure our system so that the low-cost version would
be usable--in fact, as I propose it, more usable than any currently
available system--but, if the user purchased options, then
the augmented system would perform better. It is easier to
build "high" and downgrade, than to build "low" and upgrade.
For example, the bottleneck in my proposal is the mass storage device.
The system can run effectively from a mini-cassette (the system I
built at HP did in fact use such devices); however, if the
user has a floppy, or better yet a hard disc, then performance
is immediately upgraded. Files can be accessed faster; compilations and
assemblies are faster; however, the general system characteristics
do not change. The on-screen portion of the editing 
task would be just as reactive;
the debugging process would be identical. Only when the user accessed the
mass storage device would performance be affected.

If we limit ourselves to hardcopy input and output, if we base
the system's specification on the capabilities of low-speed peripherals, then we
have boxed ourselves into a development system which will not
grow gracefully  into high speed intercommunicating networks.
The system would have to be redesigned and rewritten; that would
be a dreadful waste.
.SS(General Comments and Summary)
Considering our time constraints,
it would be foolhardy to propose such a breadboard on a processor
which was not to appear in the final product if the implementation
were to be done in assembly language. The ideas would surely transfer, but
their implementation would not. Since we will use a machine-independent
implementation language, both the ideas %3and%1 the code will transfer;
nothing will be lost.

I am confident that I can develop the
tools in a timely fashion. I implemented a comparable version of the
Stanford editor in about three months on a 16-bit HP %7m%1-processor.
This processor was quite primitive, essentially a PDP-8. All character
generation and screen manipulation was done in software, all without
benefit of even an assembler; the whole editor was contained in 4K 16-bit words.
The final system contained full on-line
documentation on an HP tape cartridge. Surely the task will be
simpler on the UCSD Pascal system, perhaps allowing the module to be completed within
two months.
I would expect similar development times for the other tools; and
I would expect similar sizes (4K 16-bit words) for the modules.
This expectation is based on  experience. The system I have been describing
and demonstrating at Stanford is essentially the one which we
developed and implemented in 1965-1966 for a PDP-1 at Stanford.
That machine had 18-bit words, with only 4K available to a user.
The editor, assembler, Algol-like language, and machine language debugger
were each restricted to run in 4K. Surely we can expect to do as well.

The debugger, RAID%8TM%1, had access to the user's symbol table, giving us
full symbolic debugging. The assembler, PASS, was one-pass
and block-structured. The single pass gave us speed; the block structure
helped immensely in making the assembly language modular, and
kept the residency requirements for symbol tables at a minimum.
The editor, TVEDIT, was also developed there.

This system was based on exactly the
same display principles which I am advocating for our development system.
The display philosophy was buried in the heart of the system.
The operating system, contained in 12K words, supported generalized
communication protocols for switching character streams between 
"sinks and sources" of the users. Its weakest link was its file
system which, alas, was shared by an IBM7090; therefore
we had to be compatible with it. In light of that
experience, I would advocate a UNIX-like file system. 
In fact, UNIX may be an excellent model for our operating system.
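
A Pascal sketch of the sink-and-source discipline mentioned above follows;
the routine name and the choice of the standard input and output files as
the two ends are illustrative assumptions, not a description of the PDP-1
system or of UNIX.
.begin nofill
(* Hypothetical sketch of "sinks and sources": any producer of a
   character stream can be connected to any consumer.  Here the
   keyboard (input) is the source and the screen (output) the sink;
   a file, a printer, or another user's terminal could stand in for
   either end. *)
program StreamSketch(input, output);

procedure CopyStream(var source, sink : text);
var c : char;
begin
  while not eof(source) do
    if eoln(source) then
      begin readln(source); writeln(sink) end   (* pass line boundaries through *)
    else
      begin read(source, c); write(sink, c) end (* copy one character at a time *)
end;

begin
  CopyStream(input, output)
end.
.end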

UNIX is a very popular operating system for the PDP-11 family of machines;
there are even versions of 
UNIX which run on minimally configured LSI-11's. 
Just as Pascal
has become a %3de facto%1 standard programming language, I  expect UNIX to become
the %3de facto%1 micro operating system. First, the user community has adopted
UNIX because of its powerful, friendly interface; it is an unobtrusive
operating system.
Second, UNIX is portable.
It has been "ported" to many diverse, even 
mutually hostile, machines: from Interdata to IBM. 
Though it is not
written in Pascal, it %3is%1 written in a semi-high level language
named "C"; therefore the effort in understanding, modifying,
or moving the operating system to new machines
is minimized.
Third, UNIX has been specified as the operating environment
for  the DOD-1 language: i.e. a PDP-11/70 running UNIX.  All this considered,
there may be substantial
benefit in advertising a UNIX-compatible system.

This paper has addressed several issues and will, I hope, generate
some (heated) discussion. That is its point: we must start moving.
There is no more time for procrastination. We need to make some decisions
now and begin work.